Euclidean Contractivity of Neural Networks With Symmetric Weights


Abstract

This paper investigates stability conditions of continuous-time Hopfield and firing-rate neural networks by leveraging contraction theory. First, we present a number of useful general algebraic results on matrix polytopes and products of symmetric matrices. Then, we give sufficient conditions for strong and weak Euclidean contractivity, i.e., contractivity with respect to the ℓ2 norm, of both models with symmetric weights and (possibly) non-smooth activation functions. Our analysis leads to contraction rates which are log-optimal for almost all synaptic weight matrices. Finally, we use our results to propose a neural network model to solve a quadratic optimization problem with box constraints.
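To make the contractivity notion in the abstract concrete, here is a minimal NumPy sketch that evaluates the ℓ2 logarithmic norm (matrix measure) of the Jacobian of a generic Hopfield-type model with symmetric weights. The tanh activation, the weight scaling, and all variable names are illustrative assumptions, not the paper's exact construction, and the crude bound in the comments is much weaker than the paper's log-optimal rates.

```python
import numpy as np

def mu_2(A):
    """l2 logarithmic norm (matrix measure): the largest eigenvalue
    of the symmetric part (A + A.T) / 2."""
    return np.max(np.linalg.eigvalsh((A + A.T) / 2.0))

# Hypothetical Hopfield-type dynamics: dx/dt = -x + W @ tanh(x) + u,
# with symmetric synaptic weights W and activation slopes in [0, 1].
rng = np.random.default_rng(0)
n = 5
S = rng.standard_normal((n, n))
W = (S + S.T) / 2.0
W = 0.9 * W / np.linalg.norm(W, 2)   # scale so the spectral norm is below 1 (illustrative)

# Jacobian at a point x: J(x) = -I + W @ diag(phi'(x)), with phi = tanh.
x = rng.standard_normal(n)
J = -np.eye(n) + W @ np.diag(1.0 - np.tanh(x) ** 2)

# With ||W||_2 < 1 and slopes in [0, 1], mu_2(J(x)) <= -1 + ||W||_2 < 0 for every x,
# so the flow is contracting with respect to the l2 norm.
print("mu_2(J(x)) =", mu_2(J))
```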


Similar Articles

On Neural Networks with Minimal Weights

Linear threshold elements are the basic building blocks of artificial neural networks. A linear threshold element computes the sign of a weighted sum of the input variables. The weights are arbitrary integers; in fact, they can be very large integers, exponential in the number of input variables. However, in practice, it is difficult to implement large weights. In the present l...
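As a minimal sketch of the gate described above, the following Python snippet implements a linear threshold element; the ±1 input encoding, the majority-gate weights, and the function name are illustrative assumptions.

```python
import numpy as np

def linear_threshold_element(w, theta, x):
    """Output the sign of the weighted sum of the inputs minus a threshold."""
    return 1 if np.dot(w, x) - theta >= 0 else -1

# Hypothetical 3-input majority gate realized with minimal (unit) weights.
w = np.array([1, 1, 1])
theta = 0
print(linear_threshold_element(w, theta, np.array([1, -1, 1])))    # -> 1
print(linear_threshold_element(w, theta, np.array([-1, -1, 1])))   # -> -1
```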


Asymptotic Description of Neural Networks with Correlated Synaptic Weights

We study the asymptotic law of a network of interacting neurons when the number of neurons becomes infinite. Given a completely connected network of neurons in which the synaptic weights are Gaussian correlated random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the s...


On the VC-dimension of neural networks with binary weights

We investigate the VC-dimension of the perceptron and simple two-layer networks like the committee- and the parity-machine with weights restricted to values ±1. For binary inputs, the VC-dimension is determined by atypical pattern sets, i.e., it cannot be found by replica analysis or numerical Monte Carlo sampling. For small systems, exhaustive enumerations yield exact results. For systems that ar...


Capacity of Two-Layer Feedforward Neural Networks with Binary Weights

The lower and upper bounds for the information capacity of two-layer feedforward neural networks with binary interconnections, integer thresholds for the hidden units, and zero threshold for the output unit are obtained through two steps. First, through a constructive approach based on statistical analysis, it is shown that a specifically constructed (N, 2L, 1) network with N input units, 2L hidde...



Journal

Journal title: IEEE Control Systems Letters

Year: 2023

ISSN: 2475-1456

DOI: https://doi.org/10.1109/lcsys.2023.3278250